
Paper   IPM / Cognitive Sciences / 11608
School of Cognitive Sciences
  Title:   Evidence-Based Mixture of MLP-Experts
  Author(s): 
1.  Saeed Masoudnia
2.  Mohammad Rostami
3.  Mahdi Tabassian
4.  Atena Sajedin
5.  Reza Ebrahimpour
  Status:   In Proceedings
  Proceeding: International Joint Conference on Neural Networks (IJCNN) 2010
  Year:  2010
  Supported by:  IPM
  Abstract:
Mixture of Experts (ME) is a modular neural network architecture for supervised learning. In this paper, we propose an evidence-based ME to deal with the classification problem. In the basic form of ME, the problem space is automatically divided into several subspaces for the experts, and the outputs of the experts are combined by a gating network. Satisfactory performance of the basic ME depends on the diversity among the experts. In conventional ME, this diversity is provided by different initializations of the experts and by supervision of the gating network during the learning procedure. The main idea of our proposed method is to employ the Dempster-Shafer (D-S) theory of evidence to improve the determination of the learning parameters (which results in more diverse experts) and the way the experts' decisions are combined. Experimental results on several data sets from the UCI repository show that our proposed method yields better classification rates compared to the basic ME and to static combining of neural networks based on D-S theory.
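
The abstract refers to combining experts' decisions with the Dempster-Shafer theory of evidence. As a rough illustration only (not the authors' implementation), the sketch below shows Dempster's rule of combination for two experts whose outputs have been converted to mass functions over singleton class hypotheses plus the full frame Theta; the function name, class labels, and mass values are hypothetical.

def dempster_combine(m1, m2):
    """Combine two mass functions over singleton classes plus 'Theta'
    (total ignorance) using Dempster's rule of combination.

    m1, m2: dicts mapping each class label and 'Theta' to a mass in [0, 1];
            each dict's masses sum to 1.
    Returns the normalized combined mass function as a dict.
    """
    classes = [k for k in m1 if k != 'Theta']
    combined = {}
    # Intersections that yield a singleton class: class & class, class & Theta
    for c in classes:
        combined[c] = (m1[c] * m2[c]
                       + m1[c] * m2['Theta']
                       + m1['Theta'] * m2[c])
    # Only Theta & Theta keeps total ignorance
    combined['Theta'] = m1['Theta'] * m2['Theta']
    # Mass on contradictory singleton pairs is the conflict K
    conflict = sum(m1[a] * m2[b]
                   for a in classes for b in classes if a != b)
    # Dempster's rule normalizes by 1 - K
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}


# Hypothetical example: two experts' beliefs over classes 'A' and 'B'
m_expert1 = {'A': 0.6, 'B': 0.2, 'Theta': 0.2}
m_expert2 = {'A': 0.5, 'B': 0.3, 'Theta': 0.2}
print(dempster_combine(m_expert1, m_expert2))
# -> {'A': 0.722..., 'B': 0.222..., 'Theta': 0.055...}

How expert outputs are mapped to mass functions, and how the gating network and learning parameters are adapted, are specific to the paper and not reproduced here.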
